89 research outputs found

    askMEDLINE: a free-text, natural language query tool for MEDLINE/PubMed

    BACKGROUND: Few plain-language search tools exist for MEDLINE/PubMed. We wanted to develop a search tool that would allow anyone to find relevant citations in MEDLINE/PubMed with a free-text, natural language query, without knowing the specialized vocabularies an expert searcher might use. This tool would translate a question into an efficient search. RESULTS: The accuracy and relevance of retrieved citations were compared to references cited in BMJ POEMs and in CATs (critically appraised topics) questions from the University of Michigan Department of Pediatrics. askMEDLINE correctly matched the cited references on first pass for 75.8% of POEMs and 89.2% of CATs questions. When articles deemed relevant to the clinical questions were included, the overall efficiency in retrieving journal articles was 96.8% (POEMs) and 96.3% (CATs). CONCLUSION: askMEDLINE might be a useful search tool for clinicians, researchers, and other information seekers interested in finding current evidence in MEDLINE/PubMed. The text-only format could be convenient for users with wireless handheld devices and those with low-bandwidth connections in remote locations.
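    askMEDLINE's own query-translation step is not described in this abstract. As a rough illustration of the general idea, the sketch below sends a free-text clinical question to PubMed through the public NCBI E-utilities esearch endpoint and relies on PubMed's automatic term mapping; this is generic E-utilities usage under stated assumptions, not askMEDLINE's actual implementation.

```python
# Minimal sketch: query PubMed with a free-text question via NCBI E-utilities (esearch).
# Generic illustration only -- not askMEDLINE's own query-translation pipeline.
import json
import urllib.parse
import urllib.request

EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(question: str, max_results: int = 10) -> list[str]:
    """Return PMIDs for a plain-language question, letting PubMed map terms itself."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": question,        # free text; no MeSH or field tags required
        "retmax": max_results,
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{EUTILS_ESEARCH}?{params}") as resp:
        data = json.load(resp)
    return data["esearchresult"]["idlist"]

if __name__ == "__main__":
    print(search_pubmed("Does zinc shorten the duration of the common cold in children?"))
```

    Retrieved PMIDs could then be compared against the references cited in POEMs or CATs, which is broadly how the retrieval accuracy reported above was scored.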

    Bias in the journal impact factor

    The ISI journal impact factor (JIF) is based on a sample that may represent half of the whole-of-life citations to some journals, but only a small fraction (<10%) of the citations accruing to other journals. This disproportionate sampling means that the JIF provides a misleading indication of the true impact of journals, biased in favour of journals that have a rapid rather than a prolonged impact. Many journals exhibit a consistent pattern of citation accrual from year to year, so it may be possible to adjust the JIF to provide a more reliable indication of a journal's impact. Comment: 9 pages, 8 figures; one reference corrected.
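    For context, the two-year impact factor that the paper critiques follows the standard definition below; this is the textbook formula, not one quoted from the abstract. Citations that arrive after the two-year census window fall outside the sample, which is the source of the disproportionate sampling described above.

```latex
% Standard two-year journal impact factor for census year Y
\mathrm{JIF}_Y =
  \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}
       {\text{citable items published in years } Y-1 \text{ and } Y-2}
```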

    Webometric analysis of departments of librarianship and information science: a follow-up study

    This paper reports an analysis of the websites of UK departments of library and information science. Inlink counts for these websites revealed no statistically significant correlation with the quality of the research carried out by these departments, as quantified using departmental grades in the 2001 Research Assessment Exercise and citations in Google Scholar to publications submitted for that Exercise. Reasons for this lack of correlation include: difficulties in disambiguating departmental websites from larger institutional structures; the relatively small amount of research-related material in departmental websites; and limitations in the ways that current Web search engines process linkages to URLs. It is concluded that departmental-level webometric analyses do not at present provide an appropriate technique for evaluating academic research quality and, more generally, that standards are needed for the formatting of URLs if inlinks are to become firmly established as a tool for website analysis.
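    The abstract does not name the statistic used to test the association; the sketch below assumes a rank correlation between inlink counts and RAE grades, a common choice in webometric studies, with placeholder data rather than figures from the paper.

```python
# Hedged sketch: rank correlation between departmental inlink counts and research ratings.
# All numbers are illustrative placeholders, not data from the study.
from scipy.stats import spearmanr

inlinks = [120, 45, 300, 80, 60, 210]   # hypothetical inlink counts per department
rae_grades = [5, 4, 5, 3, 4, 5]         # hypothetical 2001 RAE grades on a numeric scale

rho, p_value = spearmanr(inlinks, rae_grades)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A non-significant p-value would match the paper's finding that inlink counts
# do not track departmental research quality.
```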

    Proof over promise: towards a more inclusive ranking of Dutch academics in Economics & Business

    The Dutch Economics top-40, based on publications in ISI-listed journals, is - to the best of our knowledge - the oldest ranking of individual academics in Economics and is well accepted in the Dutch academic community. However, this ranking is based on publication volume rather than on the actual impact of the publications in question. This paper therefore uses two relatively new metrics, the citations per author per year (CAY) metric and the individual annual h-index (hIa), to provide two alternative, citation-based rankings of Dutch academics in Economics & Business. As a data source, we use Google Scholar instead of ISI to provide a more comprehensive measure of impact, including citations to and from publications in non-ISI-listed journals, books, working papers and conference papers. The resulting rankings are shown to be substantially different from the original ranking based on publications. Like other research metrics, the CAY metric or hIa-index should never be used as the sole criterion to evaluate academics. However, we do argue that the hIa-index and the related citations per author per year metric provide an important additional perspective over and above a ranking based on publications in high-impact journals alone. Citation-based rankings are also shown to inject a higher level of diversity in terms of age, gender, discipline and academic affiliation, and thus appear to be more inclusive of a wider range of scholarship.
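    The abstract does not spell out how the hIa and CAY metrics are computed; the sketch below follows one common operationalization (author-fractionalized citations, normalized by career length) and should be read as an assumption about the procedure, not the paper's exact method.

```python
# Hedged sketch of an individual annual h-index (hIa) and a citations-per-author-per-year
# style metric. One common operationalization; details are assumptions, not necessarily
# the paper's exact procedure. Example data are invented.

def h_index(citation_counts):
    """Classic h-index: the largest h such that h papers each have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

def hIa(papers, career_years):
    """papers: list of (citations, n_authors) tuples; career_years: years since first paper.
    Citations are divided by author count before taking the h-index, then the result is
    divided by career length to correct for academic age."""
    fractional = [cites / authors for cites, authors in papers]
    return h_index(fractional) / career_years

def citations_per_author_per_year(papers, career_years):
    """Author-fractionalized citations averaged over the career (CAY-style metric)."""
    return sum(cites / authors for cites, authors in papers) / career_years

papers = [(120, 3), (45, 2), (10, 1), (60, 4)]   # hypothetical (citations, authors) pairs
print(hIa(papers, career_years=10))
print(citations_per_author_per_year(papers, career_years=10))
```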

    Impact Factor: outdated artefact or stepping-stone to journal certification?

    A review of Garfield's journal impact factor and its specific implementation as the Thomson Reuters Impact Factor reveals several weaknesses in this commonly used indicator of journal standing. Key limitations include the mismatch between citing and cited documents, the deceptive display of three decimals that belies the real precision, and the absence of confidence intervals. These are minor issues that are easily amended and should be corrected, but more substantive improvements are needed. There are indications that the scientific community seeks and needs better certification of journal procedures to improve the quality of published science. Comprehensive certification of editorial and review procedures could help ensure adequate procedures to detect duplicate and fraudulent submissions. Comment: 25 pages, 12 figures, 6 tables.
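    The paper's exact construction of confidence intervals is not given in this abstract; as one simple possibility, an interval around the impact factor (the mean citations per citable item) could be obtained by bootstrapping over per-article citation counts, as in the hedged sketch below.

```python
# Hedged sketch: bootstrap confidence interval for a journal impact factor, treated here
# as the mean citations per citable item. Illustrative only; not necessarily the interval
# construction the paper proposes. Citation counts are invented.
import random

def bootstrap_jif_ci(citations_per_item, n_boot=10_000, alpha=0.05, seed=0):
    rng = random.Random(seed)
    n = len(citations_per_item)
    means = sorted(sum(rng.choices(citations_per_item, k=n)) / n for _ in range(n_boot))
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

cites = [0, 0, 1, 2, 2, 3, 5, 8, 12, 40]   # hypothetical per-article citation counts
jif = sum(cites) / len(cites)
lo, hi = bootstrap_jif_ci(cites)
print(f"JIF = {jif:.3f}, 95% CI = ({lo:.1f}, {hi:.1f})")
# An interval this wide shows why quoting the JIF to three decimals overstates its precision.
```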